Spectral projected subgradient method for nonsmooth convex optimization problems

Authors

Abstract

We consider constrained optimization problems with a nonsmooth objective function in the form of a mathematical expectation. The Sample Average Approximation (SAA) is used to estimate the objective function, and a variable sample size strategy is employed. The proposed algorithm combines an SAA subgradient with the spectral coefficient in order to provide a suitable direction which improves the performance of the first-order method, as shown by numerical results. The step sizes are chosen from a predefined interval, and almost sure convergence of the method is proved under standard assumptions in a stochastic environment. To further enhance the algorithm, we specify the choice of step size by introducing an Armijo-like procedure adapted to this framework. Considering the computational cost on machine learning problems, we conclude that the line search improves the performance significantly. Numerical experiments conducted on finite sum problems also reveal that the variable sample size strategy outperforms the full sample approach.
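To make the ingredients above concrete, the following is a minimal, hypothetical Python sketch of one possible realization: SAA subgradients of a hinge-loss finite sum stand in for the expectation, a Barzilai-Borwein-style spectral coefficient is safeguarded to a predefined interval, and the sample size grows over iterations. The problem, helper names (`sample_subgradient`, `project`), and all constants are illustrative assumptions, not the authors' implementation (which additionally uses an Armijo-like line search).

```python
# Hypothetical sketch of a spectral projected subgradient iteration with a
# variable sample size; illustrative only, not the authors' algorithm.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 20))          # data for a finite-sum problem
b = np.sign(rng.standard_normal(1000))       # labels

def sample_subgradient(x, idx):
    """SAA subgradient of the mean hinge loss over the sample `idx`."""
    margins = b[idx] * (A[idx] @ x)
    active = margins < 1.0                   # hinge loss is nonsmooth at margin 1
    return -(b[idx, None] * A[idx])[active].sum(axis=0) / len(idx)

def project(x, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^n (the feasible set)."""
    return np.clip(x, lo, hi)

x = np.zeros(20)
x_old, g_old = None, None
n_sample = 50                                # variable sample size: grows over iterations
for k in range(100):
    idx = rng.choice(1000, size=min(n_sample, 1000), replace=False)
    g = sample_subgradient(x, idx)
    if g_old is None:
        lam = 1.0                            # initial spectral coefficient
    else:
        s, y = x - x_old, g - g_old
        lam = s @ s / (s @ y) if s @ y > 0 else 1.0  # BB-like spectral coefficient
    lam = min(max(lam, 1e-4), 1e4)           # safeguard: keep lam in a predefined interval
    x_old, g_old = x, g
    x = project(x - lam * g)                 # projected spectral subgradient step
    n_sample = int(n_sample * 1.1) + 1       # increase the sample size
print("final mean hinge loss:", np.mean(np.maximum(0, 1 - b * (A @ x))))
```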


Similar articles

On the projected subgradient method for nonsmooth convex optimization in a Hilbert space

We consider the method for constrained convex optimization in a Hilbert space, consisting of a step in the direction opposite to an ε_k-subgradient of the objective at a current iterate, followed by an orthogonal projection onto the feasible set. The normalized stepsizes α_k are exogenously given, satisfying ∑_{k=0}^∞ α_k = ∞, ∑_{k=0}^∞ α_k² < ∞, and ε_k is chosen so that ε_k → 0. We prove t...

Full text
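As a toy illustration of the scheme in the abstract above (my own sketch, not from the paper), the following Python snippet runs a projected subgradient method with exact subgradients (ε_k = 0) and normalized diminishing stepsizes α_k = 1/(k+1), which satisfy the divergent-sum and square-summable conditions:

```python
# Toy projected subgradient method with normalized stepsizes alpha_k = 1/(k+1):
# sum alpha_k = inf and sum alpha_k^2 < inf. Exact subgradients (epsilon_k = 0).
import numpy as np

def subgrad(x):
    return np.sign(x)                         # a subgradient of f(x) = ||x||_1

def project_ball(x, center, radius=1.0):
    """Orthogonal projection onto a Euclidean ball (the feasible set)."""
    d = np.linalg.norm(x - center)
    return x if d <= radius else center + radius * (x - center) / d

center = np.array([2.0, -1.0])                # a ball that excludes the origin
x = center.copy()
for k in range(20000):
    alpha = 1.0 / (k + 1)                     # exogenously given stepsize
    g = subgrad(x)
    d = g / max(np.linalg.norm(g), 1e-12)     # normalized subgradient direction
    x = project_ball(x - alpha * d, center)   # step opposite to the subgradient
print("approx. constrained minimizer of ||x||_1:", np.round(x, 3))
```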

Randomized Block Subgradient Methods for Convex Nonsmooth and Stochastic Optimization

Block coordinate descent methods and stochastic subgradient methods have been extensively studied in optimization and machine learning. By combining randomized block sampling with stochastic subgradient methods based on dual averaging ([22, 36]), we present stochastic block dual averaging (SBDA)—a novel class of block subgradient methods for convex nonsmooth and stochastic optimization. SBDA re...

Full text
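The following is a loose, simplified Python sketch of the randomized-block dual-averaging idea; the ℓ1 objective, block structure, scaling, and prox weight are all my own assumptions, not the SBDA algorithm as specified in the paper:

```python
# Simplified randomized-block dual averaging: only a sampled block's
# subgradient is computed per iteration, and the running sum of subgradients
# drives a dual-averaging update. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n, n_blocks = 8, 4
blocks = np.split(np.arange(n), n_blocks)     # 4 blocks of 2 coordinates
target = rng.standard_normal(n)               # minimize f(x) = ||x - target||_1

z = np.zeros(n)                               # running sum of block subgradients
x = np.zeros(n)
for k in range(20000):
    blk = blocks[rng.integers(n_blocks)]      # randomized block sampling
    g_blk = np.sign(x[blk] - target[blk])     # subgradient of f on the block
    z[blk] += n_blocks * g_blk                # scale by 1/p for an unbiased sum
    beta = 5.0 * np.sqrt(k + 1)               # growing prox weight
    x = -z / beta                             # dual averaging with (1/2)||x||^2
print("iterate:", np.round(x, 2))
print("target: ", np.round(target, 2))
```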

Mirror descent and nonlinear projected subgradient methods for convex optimization

The mirror descent algorithm (MDA) was introduced by Nemirovsky and Yudin for solving convex optimization problems. This method exhibits an efficiency estimate that is only mildly dependent on the dimension of the decision variables, and is thus suitable for solving very large scale optimization problems. We present a new derivation and analysis of this algorithm. We show that the MDA can be viewed as a nonline...

Full text
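For concreteness, here is the standard entropy instance of MDA on the probability simplex, where the nonlinear projection reduces to renormalization (a textbook example with an illustrative objective, not taken from the paper):

```python
# Entropy mirror descent on the probability simplex: MDA with the
# negative-entropy mirror map reduces to a multiplicative-weights update.
import numpy as np

c = np.array([0.3, -0.1, 0.7, 0.2])           # minimize <c, x> over the simplex
x = np.full(c.size, 1.0 / c.size)             # start from the uniform distribution
for k in range(300):
    eta = 1.0 / np.sqrt(k + 1)                # diminishing stepsize
    x = x * np.exp(-eta * c)                  # gradient step in the dual geometry
    x /= x.sum()                              # Bregman 'projection' = renormalize
print("mass concentrates on the cheapest coordinate:", np.round(x, 3))
```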

Convergent Subgradient Methods for Nonsmooth Convex Minimization

In this paper, we develop new subgradient methods for solving nonsmooth convex optimization problems. These methods are the first ones for which the whole sequence of test points is endowed with worst-case performance guarantees. The new methods are derived from a relaxed estimating sequences condition, which allows reconstruction of the approximate primal-dual optimal solutions. Our metho...

Full text

Parallel Subgradient Method for Nonsmooth Convex Optimization with a Simple Constraint

In this paper, we consider the problem of minimizing the sum of nondifferentiable, convex functions over a closed convex set in a real Hilbert space, which is simple in the sense that the projection onto it can be easily calculated. We present a parallel subgradient method for solving it, together with two convergence analyses of the method. One analysis shows that the parallel method with a small con...

Full text
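A rough sketch of how such a parallel scheme might look (my own simplified reading, not the paper's exact method): each component function takes an independent projected subgradient step, and the results are averaged and projected onto the simple constraint set:

```python
# Simplified parallel subgradient iteration for minimizing f_1 + f_2 + f_3
# over a box (a 'simple' set with an easy projection). Each component step is
# independent (hence parallelizable); results are averaged and projected.
import numpy as np

subgrads = [
    lambda x: np.sign(x - 1.0),               # subgradient of ||x - 1||_1
    lambda x: np.sign(x + 2.0),               # subgradient of ||x + 2||_1
    lambda x: 2.0 * x,                        # gradient of ||x||^2
]

def project_box(x, lo=-1.0, hi=1.0):
    return np.clip(x, lo, hi)                 # easy projection onto the box

x = np.full(3, 0.8)
for k in range(5000):
    alpha = 1.0 / (k + 1)
    trials = [project_box(x - alpha * g(x)) for g in subgrads]  # parallelizable
    x = project_box(np.mean(trials, axis=0))  # average, then project
print("approximate minimizer over the box:", np.round(x, 3))
```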


Journal

Journal title: Numerical Algorithms

Year: 2022

ISSN: 1017-1398, 1572-9265

DOI: https://doi.org/10.1007/s11075-022-01419-3